α-Mutual Information

Author

  • Sergio Verdú

Abstract

Rényi entropy and Rényi divergence evidence a long track record of usefulness in information theory and its applications. Alfred Rényi never got around to generalizing mutual information in a similar way. In fact, in the literature there are several possible ways to accomplish such a generalization, most notably those suggested by Suguru Arimoto, Imre Csiszár, and Robin Sibson. We collect several interesting properties and applications of the proposal by Sibson, hopefully making a case for its more widespread adoption.

I. RÉNYI ENTROPY AND RÉNYI DIVERGENCE

In 1960, Alfred Rényi [25] re-examined Shannon's axiomatic approach to the definition of entropy by replacing the chain rule with the weaker requirement that the randomness measure of the product of two discrete distributions be equal to the sum of the corresponding randomness measures. The resulting measure is the following.

Definition 1. For a discrete random variable X, the Rényi entropy of order α ∈ [0, ∞] is defined as

    H_α(X) = log |{x ∈ A : P_X(x) > 0}|,   α = 0 …
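To make Definition 1 concrete, here is a minimal numerical sketch of the Rényi entropy. The α = 0 branch shown above is the log of the support size; for the other orders the sketch uses the standard formula H_α(X) = log(Σ_x P_X(x)^α) / (1 − α), together with the usual limiting cases α → 1 (Shannon entropy) and α → ∞ (min-entropy). The function name and the example distribution are illustrative, not from the paper.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy (in nats) of a discrete distribution p, for order alpha."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # restrict to the support, as in the alpha = 0 case above
    if alpha == 0:
        return np.log(p.size)              # log of the support size
    if alpha == 1:
        return -np.sum(p * np.log(p))      # Shannon entropy (limit alpha -> 1)
    if np.isinf(alpha):
        return -np.log(p.max())            # min-entropy (limit alpha -> infinity)
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

# Rényi entropy is non-increasing in alpha; all orders coincide for uniform p.
p = [0.5, 0.25, 0.125, 0.125]
values = [renyi_entropy(p, a) for a in (0, 0.5, 1, 2, np.inf)]
```

A quick sanity check of the monotonicity in α on a non-uniform distribution, and of the fact that every order returns log n on the uniform distribution, makes the role of α as a tunable "randomness lens" tangible.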


Similar Resources

Computation of Csiszár's mutual information of order α

Csiszár introduced the mutual information of order α in [1] as a parameterized version of Shannon’s mutual information. It involves a minimization over the probability space, and it cannot be computed in closed form. An alternating minimization algorithm for its computation is presented in this paper, along with a proof of its convergence. Furthermore, it is proved that the algorithm is an inst...
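For contrast with Csiszár's order-α mutual information, which requires the minimization described above, the Sibson proposal discussed in the main paper admits a closed form for α ≠ 1: I_α(X;Y) = (α/(α−1)) log Σ_y (Σ_x P_X(x) P_{Y|X}(y|x)^α)^{1/α}, so no alternating minimization is needed. A minimal sketch of that closed form (the function name and the binary-symmetric-channel example are illustrative):

```python
import numpy as np

def sibson_mi(p_x, p_y_given_x, alpha):
    """Sibson's alpha-mutual information (in nats), alpha != 1, via its closed form:
    I_alpha(X;Y) = (alpha/(alpha-1)) * log sum_y (sum_x P(x) P(y|x)^alpha)^(1/alpha)."""
    p_x = np.asarray(p_x, dtype=float)
    w = np.asarray(p_y_given_x, dtype=float)   # channel matrix; rows: x, columns: y
    inner = (p_x[:, None] * w ** alpha).sum(axis=0) ** (1.0 / alpha)
    return (alpha / (alpha - 1.0)) * np.log(inner.sum())

# Binary symmetric channel with crossover probability 0.1, uniform input.
p_x = [0.5, 0.5]
bsc = [[0.9, 0.1], [0.1, 0.9]]
i2 = sibson_mi(p_x, bsc, 2)
```

Two easy consistency checks: a noiseless channel with uniform binary input gives log 2 for every order, and a channel whose output is independent of the input gives 0.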


Independent Component Analysis Using Convex Divergence

The convex divergence is used as a surrogate function for obtaining a class of independent component analysis (ICA) algorithms called the f-ICA. The convex divergence is a superclass of the α-divergence, which in turn generalizes the Kullback–Leibler divergence and mutual information. Therefore, the f-ICA contains the α-ICA and the minimum-mutual-information ICA. In addition to theoretical int...


Convexity/concavity of Rényi entropy and α-mutual information

Entropy is well known to be Schur concave on finite alphabets. Recently, the authors have strengthened the result by showing that for any pair of probability distributions P and Q with Q majorized by P, the entropy of Q is larger than the entropy of P by the amount of relative entropy D(P||Q). This result applies to P and Q defined on countable alphabets. This paper shows the counterpart of t...


Rényi generalizations of the conditional quantum mutual information

The conditional quantum mutual information I(A;B|C) of a tripartite state ρABC is an information quantity which lies at the center of many problems in quantum information theory. Three of its main properties are that it is non-negative for any tripartite state, that it decreases under local operations applied to systems A and B, and that it obeys the duality relation I(A;B|C) = I(A;B|D) for a f...


Rényi information transfer: Partial Rényi transfer entropy and partial Rényi mutual information

Shannon and Rényi information theory have been applied to coupling estimation in complex systems using time series of their dynamical states. By analysing how information is transferred between constituent parts of a complex system, it is possible to infer the coupling parameters of the system. To this end, we introduce the partial Rényi transfer entropy and we give an alternative derivation of...


Sharp Bounds Between Two Rényi Entropies of Distinct Positive Orders

Many axiomatic definitions of entropy, such as the Rényi entropy, of a random variable are closely related to the ℓα-norm of its probability distribution. This study considers probability distributions on finite sets, and examines the sharp bounds of the ℓβ-norm with a fixed ℓα-norm, α ≠ β, for n-dimensional probability vectors with an integer n ≥ 2. From the results, we derive the sharp bound...
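The connection between Rényi entropies and ℓα-norms alluded to above is, for α ∉ {0, 1, ∞}, the identity H_α(X) = (α/(1−α)) log ‖P_X‖_α, where ‖P‖_α = (Σ_i p_i^α)^{1/α}. A small sketch checking this identity against the direct definition (function names and the example vector are illustrative):

```python
import numpy as np

def lp_norm(p, alpha):
    """'l_alpha'-norm of a probability vector: (sum_i p_i^alpha)^(1/alpha)."""
    p = np.asarray(p, dtype=float)
    return np.sum(p ** alpha) ** (1.0 / alpha)

def renyi_direct(p, alpha):
    """Direct definition: H_alpha = log(sum_i p_i^alpha) / (1 - alpha), alpha != 1."""
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def renyi_via_norm(p, alpha):
    """Same quantity through the norm: H_alpha = (alpha/(1-alpha)) * log ||p||_alpha."""
    return (alpha / (1.0 - alpha)) * np.log(lp_norm(p, alpha))

p = [0.6, 0.3, 0.1]
```

Because the two expressions agree term by term (log ‖p‖_α = (1/α) log Σ p_i^α), bounding one norm by another, as in the paper above, immediately bounds one Rényi entropy by another.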



Journal title:

Volume   Issue

Pages  -

Publication date: 2015